Learning Improved Semantic Representations with Tree-Structured LSTM for Hashtag Recommendation: An Experimental Study
Authors
Abstract
Related Articles
Hashtag Recommendation with Topical Attention-Based LSTM
Microblogging services allow users to create hashtags to categorize their posts. In recent years, the task of recommending hashtags for microblogs has received increasing attention. However, most existing methods depend on hand-crafted features. Motivated by the successful use of long short-term memory (LSTM) for many natural language processing tasks, in this paper, we adopt LSTM to learn...
Bidirectional Tree-Structured LSTM with Head Lexicalization
Sequential LSTM has been extended to model tree structures, giving competitive results for a number of tasks. Existing methods model constituent trees by bottom-up combinations of constituent nodes, making direct use of input word information only for leaf nodes. This is different from sequential LSTMs, which contain reference to input words for each node. In this paper, we propose a method for...
Location-Aware Online Learning for Top-k Hashtag Recommendation
In this paper we investigate the problem of recommending Twitter hashtags for users with known GPS location, learning online from the stream of geo-tagged tweets. Our method learns the relevance of regions in a geographical hierarchy, combined with the local popularity of the hashtag. Unlike in typical collaborative filtering settings, trends and geolocation turn out to be more important than ...
Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks
A Long Short-Term Memory (LSTM) network is a type of recurrent neural network architecture which has recently obtained strong results on a variety of sequence modeling tasks. The only underlying LSTM structure that has been explored so far is a linear chain. However, natural language exhibits syntactic properties that would naturally combine words to phrases. We introduce the Tree-LSTM, a gener...
Learning Structured Natural Language Representations for Semantic Parsing
We introduce a neural semantic parser which converts natural language utterances to intermediate representations in the form of predicate-argument structures, which are induced with a transition system and subsequently mapped to target domains. The semantic parser is trained end-to-end using annotated logical forms or their denotations. We achieve the state of the art on SPADES and GRAPHQUESTIO...
Journal
Journal title: Information
Year: 2019
ISSN: 2078-2489
DOI: 10.3390/info10040127